# Multilingual LLM
## Llama 3.1 Swallow 8B Instruct V0.2
tokyotech-llm · 2,283 · 15
Llama 3.1 Swallow is a series of large language models continually pre-trained from the Meta Llama 3.1 models, enhancing their Japanese capabilities while retaining their English capabilities.
Large Language Model · Transformers · Supports Multiple Languages
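Since the card tags the model as a Transformers checkpoint, a minimal usage sketch is shown below. The repository id `tokyotech-llm/Llama-3.1-Swallow-8B-Instruct-v0.2` and the prompt are assumptions for illustration, not taken from the listing itself.

```python
# Minimal sketch: load an instruct-tuned checkpoint with Transformers and
# generate a reply. Repository id and prompt are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tokyotech-llm/Llama-3.1-Swallow-8B-Instruct-v0.2"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "user", "content": "東京の気候を簡潔に説明してください。"},  # example Japanese prompt
]
# Instruct models expect their chat template to be applied before generation.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```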
## MalayaLLM 7B Base
VishnuPJ · Apache-2.0 · 21 · 5
MalayaLLM is a generative language model focused on Malayalam, built through continued pre-training of the LLaMA-2 model with approximately 18,000 Malayalam vocabulary tokens added to the tokenizer.
Large Language Model · Transformers · Supports Multiple Languages
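The key step described above, extending a LLaMA-2 tokenizer with new Malayalam vocabulary before continued pre-training, can be sketched as follows. The base checkpoint id and the token list are placeholders of my own, not the actual MalayaLLM assets.

```python
# Sketch of vocabulary extension prior to continued pre-training:
# add new Malayalam tokens to a LLaMA-2 tokenizer and resize the
# model's embedding matrix to match.
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "meta-llama/Llama-2-7b-hf"  # assumed base checkpoint
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)

# In practice, ~18,000 Malayalam tokens would be learned from a corpus
# (e.g. with SentencePiece); two placeholder tokens stand in here.
new_tokens = ["മലയാളം", "ഭാഷ"]
num_added = tokenizer.add_tokens(new_tokens)

# Grow the input/output embeddings so the new token ids have rows;
# continued pre-training then learns useful vectors for them.
model.resize_token_embeddings(len(tokenizer))
print(f"Added {num_added} tokens; new vocab size: {len(tokenizer)}")
```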
## Tamil LLaMA 7B Base v0.1
abhinand · 761 · 10
A 7-billion-parameter Tamil language model based on the LLaMA-2 architecture, supporting causal language modeling in Tamil and English.
Large Language Model · Transformers · Supports Multiple Languages
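As a base (non-instruct) checkpoint, this model is typically used for plain causal language modeling, i.e. text continuation. A minimal sketch follows, assuming the repository id `abhinand/tamil-llama-7b-base-v0.1` and a Tamil prompt chosen for illustration.

```python
# Minimal sketch: plain causal-LM continuation with a base checkpoint.
# Repository id and prompt are assumptions, not from the listing above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "abhinand/tamil-llama-7b-base-v0.1"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

prompt = "தமிழ் மொழி"  # "Tamil language"; the base model simply continues the text
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```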